An approach for constructing parsimonious generalized Gaussian kernel regression models

Authors

  • Xunxian Wang
  • Sheng Chen
  • David J. Brown
Abstract

The paper proposes a novel construction algorithm for generalized Gaussian kernel regression models. Each kernel regressor in the generalized Gaussian kernel regression model has an individual diagonal covariance matrix, which is determined by maximizing the correlation between the training data and the regressor using a repeated guided random search based on boosting optimization. The standard orthogonal least squares algorithm is then used to select a sparse generalized kernel regression model from the resulting full regression matrix. Experimental results involving two real data sets demonstrate the effectiveness of the proposed regression modeling approach.
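Since the abstract compresses the whole construction procedure into two sentences, the following Python sketch may help fix the ideas. It is only an illustration under simplifying assumptions: the paper's boosting-based guided random search is replaced here by plain repeated random search, and the helper names (gaussian_regressor, fit_diag_cov, ols_select) are invented for this sketch rather than taken from the paper.

import numpy as np

def gaussian_regressor(X, centre, diag_cov):
    """Generalized Gaussian kernel: each regressor carries its own diagonal covariance."""
    return np.exp(-0.5 * np.sum((X - centre) ** 2 / diag_cov, axis=1))

def fit_diag_cov(X, y, centre, n_trials=200, rng=None):
    """Choose the diagonal covariance that maximizes |correlation| between the
    regressor output and the targets; plain repeated random search stands in
    for the paper's boosting-guided search."""
    rng = np.random.default_rng() if rng is None else rng
    spread = X.std(axis=0) + 1e-12
    best_cov, best_corr = spread ** 2, -np.inf
    for _ in range(n_trials):
        cov = (spread * rng.uniform(0.1, 3.0, size=X.shape[1])) ** 2
        corr = abs(np.corrcoef(gaussian_regressor(X, centre, cov), y)[0, 1])
        if corr > best_corr:
            best_corr, best_cov = corr, cov
    return best_cov

def ols_select(P, y, n_terms):
    """Standard orthogonal least squares forward selection over the columns of
    the full regression matrix P, ranked by the error-reduction ratio."""
    N, M = P.shape
    W = P.copy()                                   # columns get orthogonalized in place
    selected = []
    for _ in range(n_terms):
        err = np.full(M, -np.inf)
        for m in set(range(M)) - set(selected):
            d = W[:, m] @ W[:, m]
            if d > 1e-12:
                err[m] = (W[:, m] @ y) ** 2 / (d * (y @ y))   # error-reduction ratio
        m_best = int(np.argmax(err))
        selected.append(m_best)
        w = W[:, m_best]
        for m in set(range(M)) - set(selected):
            W[:, m] -= (w @ W[:, m]) / (w @ w) * w            # Gram-Schmidt step
    theta, *_ = np.linalg.lstsq(P[:, selected], y, rcond=None)
    return selected, theta

# Toy usage: one candidate regressor centred on every training input,
# then a sparse subset of them selected by OLS.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(80, 2))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(80)
P = np.column_stack([gaussian_regressor(X, c, fit_diag_cov(X, y, c, rng=rng)) for c in X])
terms, theta = ols_select(P, y, n_terms=8)
print(sorted(terms), float(np.mean((P[:, terms] @ theta - y) ** 2)))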

Similar articles

Sparse Regression Modelling Using an Incremental Weighted Optimization Method Based on Boosting with Correlation Criterion

ABSTRACT A novel technique is presented to construct sparse Gaussian regression models. Unlike most kernel regression modelling methods, which restrict kernel means to the training input data and use a fixed common variance for all the regressors, the proposed technique can tune the mean vector and diagonal covariance matrix of each individual Gaussian regressor to best fit the training data based o...
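The snippet above describes an incremental scheme in which both the mean vector and the diagonal covariance of each new Gaussian regressor are tuned against the current residual. A minimal Python sketch of that idea follows; the boosting-based incremental weighted optimization is again replaced by plain random search, and all names (gaussian_unit, tune_unit, build_incrementally) are illustrative.

import numpy as np

def gaussian_unit(X, mu, var):
    """One Gaussian regressor with its own mean vector and diagonal covariance."""
    return np.exp(-0.5 * np.sum((X - mu) ** 2 / var, axis=1))

def tune_unit(X, r, n_trials=300, rng=None):
    """Tune (mean, diagonal covariance) to maximize |correlation| with the
    current residual r; the mean is free, not restricted to a training input."""
    rng = np.random.default_rng() if rng is None else rng
    lo, hi = X.min(axis=0), X.max(axis=0)
    spread = X.std(axis=0) + 1e-12
    best_mu, best_var, best_c = None, None, -np.inf
    for _ in range(n_trials):
        mu = rng.uniform(lo, hi)
        var = (spread * rng.uniform(0.1, 3.0, size=X.shape[1])) ** 2
        c = abs(np.corrcoef(gaussian_unit(X, mu, var), r)[0, 1])
        if c > best_c:
            best_mu, best_var, best_c = mu, var, c
    return best_mu, best_var

def build_incrementally(X, y, n_units=10, rng=None):
    """Grow the model one regressor at a time, refitting the weights and the
    residual after every addition."""
    rng = np.random.default_rng(0) if rng is None else rng
    units, r = [], y.copy()
    for _ in range(n_units):
        units.append(tune_unit(X, r, rng=rng))
        P = np.column_stack([gaussian_unit(X, mu, var) for mu, var in units])
        w, *_ = np.linalg.lstsq(P, y, rcond=None)
        r = y - P @ w
    return units, w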

Orthogonal-least-squares regression: A unified approach for data modelling

A unified approach is proposed for data modelling that includes supervised regression and classification applications as well as unsupervised probability density function estimation. The orthogonal-least-squares regression based on the leave-one-out test criterion is formulated within this unified data-modelling framework to construct sparse kernel models that generalise well. Examples from regre...
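As a rough illustration of combining forward OLS-style term selection with a leave-one-out criterion, the sketch below uses the standard PRESS identity e_loo = e / (1 - h) for linear least squares. It is a brute-force stand-in, not the efficient orthogonalised LOO recursion of the unified framework, and the function names are invented here.

import numpy as np

def loo_mse(Phi, y):
    """Leave-one-out MSE of the least-squares fit y ~ Phi w, via the PRESS
    identity e_loo_i = e_i / (1 - h_ii), with h_ii the leverage (hat-matrix diagonal)."""
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    resid = y - Phi @ w
    h = np.clip(np.diag(Phi @ np.linalg.pinv(Phi)), 0.0, 1.0 - 1e-8)
    return float(np.mean((resid / (1.0 - h)) ** 2))

def forward_select_loo(P, y, max_terms=None):
    """Greedy forward selection over the columns of the full regression matrix:
    add the column that most reduces the LOO MSE and stop when it stops
    improving, so the model size is decided automatically."""
    N, M = P.shape
    max_terms = M if max_terms is None else max_terms
    selected, best = [], np.inf
    while len(selected) < max_terms:
        score, m = min((loo_mse(P[:, selected + [j]], y), j)
                       for j in range(M) if j not in selected)
        if score >= best:
            break
        best, selected = score, selected + [m]
    w, *_ = np.linalg.lstsq(P[:, selected], y, rcond=None)
    return selected, w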

A Kernel Approach to Tractable Bayesian Nonparametrics

Inference in popular nonparametric Bayesian models typically relies on sampling or other approximations. This paper presents a general methodology for constructing novel tractable nonparametric Bayesian methods by applying the kernel trick to inference in a parametric Bayesian model. For example, Gaussian process regression can be derived this way from Bayesian linear regression. Despite the su...
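The equivalence mentioned in this snippet, that Gaussian process regression falls out of kernelising Bayesian linear regression, can be checked numerically. The sketch below uses an arbitrary finite feature map (the cosine features and all parameter values are illustrative assumptions, not taken from the paper) and confirms that the weight-space and kernel/function-space predictive means coincide.

import numpy as np

rng = np.random.default_rng(1)
N, D, alpha, sigma2 = 40, 6, 2.0, 0.1          # prior variance alpha, noise variance sigma2

def phi(x):
    """An arbitrary explicit feature map, chosen only for illustration."""
    return np.cos(np.outer(x, np.linspace(0.5, 3.0, D)))

x_train = rng.uniform(-2.0, 2.0, N)
y = np.sin(x_train) + np.sqrt(sigma2) * rng.standard_normal(N)
x_test = np.linspace(-2.0, 2.0, 7)
Phi, Phi_star = phi(x_train), phi(x_test)

# Weight-space view: Bayesian linear regression with prior w ~ N(0, alpha * I).
mean_w = Phi_star @ np.linalg.solve(Phi.T @ Phi + (sigma2 / alpha) * np.eye(D), Phi.T @ y)

# Function-space view: GP regression with the induced kernel k(x, x') = alpha * phi(x) . phi(x').
K = alpha * Phi @ Phi.T
k_star = alpha * Phi_star @ Phi.T
mean_f = k_star @ np.linalg.solve(K + sigma2 * np.eye(N), y)

print(np.allclose(mean_w, mean_f))             # True: the two predictive means agree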

Bayesian Generalized Kernel Mixed Models

We propose a fully Bayesian methodology for generalized kernel mixed models (GKMMs), which are extensions of generalized linear mixed models in the feature space induced by a reproducing kernel. We place a mixture of a point-mass distribution and Silverman’s g-prior on the regression vector of a generalized kernel model (GKM). This mixture prior allows a fraction of the components of the regres...
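To make the prior structure concrete, here is a small sketch of drawing a regression vector from such a spike-and-slab prior over a kernel model's coefficients. Writing the slab covariance as g * sigma2 * inv(K) restricted to the included components is one common reading of Silverman's g-prior; the exact form used in the paper, and all names below, should be treated as assumptions of this sketch.

import numpy as np

def sample_spike_slab_gkm(K, g=10.0, sigma2=1.0, pi_incl=0.2, rng=None):
    """Draw a coefficient vector beta for a generalized kernel model from a
    mixture prior: each component is exactly zero with probability 1 - pi_incl
    (the point mass), and the included block is drawn jointly from a Gaussian
    slab with covariance g * sigma2 * inv(K_gamma) (a g-prior-style slab)."""
    rng = np.random.default_rng() if rng is None else rng
    n = K.shape[0]
    gamma = rng.random(n) < pi_incl                  # inclusion indicators
    beta = np.zeros(n)
    if gamma.any():
        K_g = K[np.ix_(gamma, gamma)]
        cov = g * sigma2 * np.linalg.inv(K_g + 1e-8 * np.eye(int(gamma.sum())))
        beta[gamma] = rng.multivariate_normal(np.zeros(int(gamma.sum())), cov)
    return beta                                      # most components are exactly zero

# Toy usage with an RBF kernel matrix over a few 1-D inputs.
x = np.linspace(0.0, 1.0, 8)
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / 0.1)
print(sample_spike_slab_gkm(K, rng=np.random.default_rng(2)))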

Sparse inverse kernel Gaussian Process regression

Regression problems on massive data sets are ubiquitous in many application domains including the Internet, earth and space sciences, and finances. Gaussian Process regression is a popular technique for modeling the input-output relations of a set of variables under the assumption that the weight vector has a Gaussian prior. However, it is challenging to apply Gaussian Process regression to lar...

Journal:
  • Neurocomputing

Volume 62, Issue

Pages  -

Publication date 2004